Error bounds of MCMC for functions with unbounded stationary variance
Authors
Abstract
Related papers
Generalization error bounds for stationary autoregressive models
We derive generalization error bounds for stationary univariate autoregressive (AR) models. We show that imposing stationarity is enough to control the Gaussian complexity without further regularization. This lets us use structural risk minimization for model selection. We demonstrate our methods by predicting interest rate movements.
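As a toy illustration (not the paper's procedure), the sketch below simulates a stationary AR(1) process and fits its coefficient by least squares, clipping the estimate into the stationary region |φ| < 1; the process parameters and the clipping threshold are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary AR(1) process: x_t = phi * x_{t-1} + eps_t with |phi| < 1.
phi_true, n = 0.6, 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal()

# Least-squares fit of the AR coefficient, clipped into the stationary region.
# The clipping stands in for the stationarity constraint discussed in the abstract.
phi_hat = np.clip(np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1]), -0.99, 0.99)
print("estimated phi:", phi_hat, "one-step forecast:", phi_hat * x[-1])
```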
Nonasymptotic bounds on the estimation error for regenerative MCMC algorithms
MCMC methods are used in Bayesian statistics not only to sample from posterior distributions but also to estimate expectations. Underlying functions are most often defined on a continuous state space and can be unbounded. We consider a regenerative setting and Monte Carlo estimators based on i.i.d. blocks of a Markov chain trajectory. The main result is an inequality for the mean square error. ...
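The block construction can be sketched in a few lines. The toy chain below, the atom at state 0, and the function f are all assumptions chosen for illustration; the abstract's setting is a general regenerative chain, not this particular random walk.

```python
import numpy as np

rng = np.random.default_rng(1)

def step(x):
    # Toy Markov chain on {0, 1, 2, ...}: reflecting random walk with downward drift.
    return max(x - 1, 0) if rng.random() < 0.6 else x + 1

# State 0 is an atom: every return to 0 closes an i.i.d. block (tour).
f = lambda x: x            # function whose stationary expectation we estimate
block_sums, block_lens = [], []
x, s, length = 0, 0.0, 0
for _ in range(200_000):
    x = step(x)
    s += f(x)
    length += 1
    if x == 0:             # regeneration: close the current block
        block_sums.append(s)
        block_lens.append(length)
        s, length = 0.0, 0

# Ratio (regenerative) estimator of E_pi[f]: sum of f over complete blocks divided
# by their total length; the blocks are i.i.d., which is what nonasymptotic
# mean-square-error bounds of the kind described in the abstract exploit.
est = np.sum(block_sums) / np.sum(block_lens)
print("regenerative estimate of E_pi[f]:", est)
```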
Unbounded-error quantum computation with small space bounds
We prove the following facts about the language recognition power of quantum Turing machines (QTMs) in the unbounded error setting: QTMs are strictly more powerful than probabilistic Turing machines for any common space bound s satisfying s(n) = o(log log n). For "one-way" Turing machines, where the input tape head is not allowed to move left, the above result holds for s(n) = o(log n). We also ...
Unbounded-Error Communication Complexity of Symmetric Functions
The sign-rank of a real matrix M is the least rank of a matrix R in which every entry has the same sign as the corresponding entry of M. We determine the sign-rank of every matrix of the form M = [D(|x ∧ y|)]_{x,y}, where D : {0, 1, ..., n} → {−1, +1} is given and x and y range over {0, 1}^n. Specifically, we prove that the sign-rank of M equals 2^{Θ̃(k)}, where k is the number of times D changes sign ...
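A small sketch of the matrix in question, under an assumed predicate D chosen only for illustration: it builds M = [D(|x ∧ y|)]_{x,y} for a tiny n and counts the sign changes k. Computing the sign-rank itself is far harder and is not attempted here.

```python
import numpy as np
from itertools import product

# Build the +/-1 matrix M = [D(|x AND y|)] for a small n and a chosen predicate D.
n = 4
D = lambda t: 1 if t % 2 == 0 else -1   # example predicate on {0, 1, ..., n}

points = list(product([0, 1], repeat=n))
M = np.array([[D(sum(a & b for a, b in zip(x, y))) for y in points] for x in points])

# k = number of times D changes sign on {0, ..., n}; the abstract's result says
# the sign-rank of M is 2 raised to roughly (up to log factors) that k.
k = sum(D(t) != D(t + 1) for t in range(n))
print("matrix shape:", M.shape,
      "ordinary rank:", np.linalg.matrix_rank(M),
      "sign changes k:", k)
```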
Relative Deviation Learning Bounds and Generalization with Unbounded Loss Functions
We present an extensive analysis of relative deviation bounds, including detailed proofs of two-sided inequalities and their implications. We also give detailed proofs of two-sided generalization bounds that hold in the general case of unbounded loss functions, under the assumption that a moment of the loss is bounded. These bounds are useful in the analysis of importance weighting and other learning ...
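As a rough illustration of the importance-weighting setting mentioned at the end of the abstract, the sketch below estimates an expectation under a target Gaussian from samples of a different Gaussian; the two distributions and the squared loss are assumptions chosen for the example, and the weighted loss is unbounded but has a finite second moment.

```python
import numpy as np

rng = np.random.default_rng(2)

def normal_pdf(x, mu, sigma):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# Importance weighting: estimate E_q[loss(X)] from samples drawn under p,
# using the weight w(x) = q(x) / p(x).
loss = lambda x: x ** 2
xs = rng.normal(0.0, 1.0, size=100_000)                       # samples from p = N(0, 1)
w = normal_pdf(xs, 0.5, 1.0) / normal_pdf(xs, 0.0, 1.0)       # target q = N(0.5, 1)
print("importance-weighted estimate of E_q[X^2]:", np.mean(w * loss(xs)),
      " exact value:", 0.5 ** 2 + 1.0)
```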
Journal
Journal title: Statistics & Probability Letters
Year: 2015
ISSN: 0167-7152
DOI: 10.1016/j.spl.2014.07.035